On Weak Base Learners for Boosting Regression and Classification

Author

  • Wenxin Jiang
Abstract

The most basic property of the boosting algorithm is its ability to reduce the training error, subject to the critical assumption that the base learners generate weak hypotheses that are better than random guessing. We exploit analogies between regression and classification to give a characterization of which base learners generate weak hypotheses, by introducing a geometric concept called the angular span for the base hypothesis space. The exponential convergence rates of boosting algorithms are shown to be bounded below by essentially the angular spans. Sufficient conditions for nonzero angular span are also given and validated for a wide class of regression and classification systems.
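For orientation, the weak-hypothesis assumption the abstract refers to is conventionally stated as a fixed edge over random guessing, and under that assumption AdaBoost's training error is known to decay exponentially in the number of rounds. A sketch of the standard statement follows; the bound is the classical Freund–Schapire result, not a result specific to this paper, which instead characterizes when such an edge exists at all:

% Weak-learning assumption: each round's base hypothesis h_t has
% weighted error at most 1/2 - gamma for some fixed gamma > 0:
\epsilon_t = \Pr_{i \sim D_t}\bigl[h_t(x_i) \neq y_i\bigr] \le \tfrac{1}{2} - \gamma .

% Standard consequence: after T rounds, the training error of the
% combined classifier H_T decays exponentially in T:
\frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{H_T(x_i) \neq y_i\}
  \le \prod_{t=1}^{T} 2\sqrt{\epsilon_t(1-\epsilon_t)}
  \le e^{-2\gamma^2 T} .

The angular span introduced here is, per the abstract, the geometric quantity that determines whether a given base hypothesis space can supply such a gamma.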


Related articles

Some Results on Weakly Accurate Base Learners for Boosting Regression and Classification

One basic property of the boosting algorithm is its ability to reduce the training error, subject to the critical assumption that the base learners generate `weak' (or more appropriately, `weakly accurate') hypotheses that are better than random guessing. We exploit analogies between regression and classification to give a characterization of which base learners generate weak hypotheses, by introdu...

Full text

On Weak Base Hypotheses and Their Implications

When studying the training error and the prediction error for boosting, it is often assumed that the hypotheses returned by the base learner are weakly accurate, or are able to beat a random guesser by a certain amount of difference. It has been an open question how large this difference can be, and whether it will eventually disappear in the boosting process or be bounded by a finite amount; see...

Full text

Boosting with the L2-loss: Regression and Classification

This paper investigates a computationally simple variant of boosting, L2Boost, which is constructed from a functional gradient descent algorithm with the L2-loss function. As with other boosting algorithms, L2Boost uses a pre-chosen fitting method, called the learner, many times in an iterative fashion. Based on the explicit expression for the refitting of residuals in L2Boost, the case with (symmetri...

Full text
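The refitting-of-residuals view described in this summary is easy to make concrete. Below is a minimal sketch of L2Boost with a least-squares regression stump as the pre-chosen learner; the stump implementation, step size nu, and toy data are illustrative choices, not taken from the paper.

import numpy as np

def fit_stump(x, r):
    # Least-squares regression stump on 1-D inputs: choose the split
    # threshold that best fits the current residuals r with two constants.
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    best = None
    for i in range(1, len(xs)):
        thr = 0.5 * (xs[i - 1] + xs[i])
        left, right = rs[:i].mean(), rs[i:].mean()
        sse = ((rs[:i] - left) ** 2).sum() + ((rs[i:] - right) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, thr, left, right)
    _, thr, left, right = best
    return lambda z: np.where(z < thr, left, right)

def l2boost(x, y, n_rounds=100, nu=0.1):
    # L2Boost: repeatedly refit the learner to the current residuals
    # (the negative gradient of the L2 loss) and take a small step nu.
    f = np.zeros_like(y, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        h = fit_stump(x, y - f)
        f += nu * h(x)
        stumps.append(h)
    return lambda z: nu * sum(h(z) for h in stumps)

# Toy usage: fit a noisy sine curve.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.normal(size=200)
model = l2boost(x, y)
print("training MSE:", ((model(x) - y) ** 2).mean())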

Boosting

Boosting is a kind of ensemble method that produces a strong learner capable of making very accurate predictions by combining rough and moderately inaccurate learners (called base learners or weak learners). In particular, boosting sequentially trains a series of base learners using a base learning algorithm, where the training examples wrongly predicted by a base learn...

Full text
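The reweighting scheme summarized above is exactly what AdaBoost does. Here is a minimal sketch with weighted decision stumps on labels in {-1, +1}; the stump learner and toy data are illustrative, not drawn from any of the papers listed here.

import numpy as np

def fit_weighted_stump(x, y, w):
    # Best threshold stump, sign * sign(x - thr), under example weights w;
    # labels y are in {-1, +1}.
    best = None
    for thr in np.unique(x):
        for sign in (1, -1):
            pred = sign * np.where(x >= thr, 1, -1)
            err = w[pred != y].sum()
            if best is None or err < best[0]:
                best = (err, thr, sign)
    err, thr, sign = best
    return err, (lambda z: sign * np.where(z >= thr, 1, -1))

def adaboost(x, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform weights
    hs, alphas = [], []
    for _ in range(n_rounds):
        err, h = fit_weighted_stump(x, y, w)
        if err >= 0.5:                   # no weak hypothesis beats guessing
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * h(x))   # up-weight the misclassified points
        w /= w.sum()
        hs.append(h)
        alphas.append(alpha)
    return lambda z: np.sign(sum(a * h(z) for a, h in zip(alphas, hs)))

# Toy usage: a noisy one-dimensional threshold problem.
rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = np.where(x + 0.3 * rng.normal(size=300) > 0, 1, -1)
H = adaboost(x, y)
print("training error:", (H(x) != y).mean())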

Large Time Behavior of Boosting

We exploit analogies between regression and classification to study certain properties of boosting algorithms. A geometric concept called the angular span is defined and related to analogs of the VC dimension and the pseudo dimension of the regression and classification systems, and to the assumption of the weak learner. The exponential convergence rates of boosting algorithms are shown to be ...

Full text
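Read schematically, the claim that the exponential convergence rate is bounded below by essentially the angular span can be written as follows; the constants are deliberately omitted here, since the precise statement is given only in the paper itself:

% If the base hypothesis space has angular span s > 0, the training
% error after T boosting rounds admits an exponential bound whose
% rate constant c(s) is positive (and, per the abstract, of order s):
\mathrm{err}(T) \le C\, e^{-c(s)\, T}, \qquad c(s) > 0 \ \text{whenever}\ s > 0 .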


Journal:

Volume   Issue

Pages  -

Publication year: 2000